Training Recurrent Neural Networks with Temporal Input Encodings
Authors
Abstract
We investigate the learning of deterministic finite-state automata (DFAs) with recurrent networks that have a single input neuron, where each input symbol is represented as a temporal pattern and strings as sequences of temporal patterns. We empirically demonstrate that obvious temporal encodings can make learning very difficult or even impossible. Based on preliminary results, we formulate some hypotheses about 'good' temporal encodings, i.e. encodings which do not significantly increase training time compared to training networks with multiple input neurons.
Similar Resources
Fractal Learning Neural Networks
One of the fundamental challenges to using recurrent neural networks (RNNs) for learning natural languages is the difficulty they have with learning complex temporal dependencies in symbol sequences. Recently, substantial headway has been made in training RNNs to process languages which can be handled by the use of one or more symbol-counting stacks (e.g., anbn, anAmBmbn, anbncn) with compelling...
Multi-Step-Ahead Prediction of Stock Price Using a New Architecture of Neural Networks
Modelling and forecasting the stock market is a challenging task for economists and engineers, since it has a dynamic structure and nonlinear characteristics. This nonlinearity affects the efficiency of the price characteristics. Using an Artificial Neural Network (ANN) is a proper way to model this nonlinearity, and it has been used successfully in one-step-ahead and multi-step-ahead prediction of di...
The Application of Multi-Layer Artificial Neural Networks in Speckle Reduction (Methodology)
Optical Coherence Tomography (OCT) uses the spatial and temporal coherence properties of optical waves backscattered from a tissue sample to form an image. An inherent characteristic of coherent imaging is the presence of speckle noise. In this study we use a new ensemble framework, a combination of several Multi-Layer Perceptron (MLP) neural networks, to denoise OCT images. The noise is...
Introducing Deep Modular Neural Networks with a Dual Spatio-Temporal Structure to Improve Continuous Persian Speech Recognition
In this article, growable deep modular neural networks for continuous speech recognition are introduced. These networks can be grown to implement the spatio-temporal information of the frame sequences at their input layer as well as their labels at the output layer at the same time. A neural network trained with such a dual spatio-temporal association structure can learn the phonetic sequence...
Training and Analyzing Deep Recurrent Neural Networks
Time series often have a temporal hierarchy, with information spread out over multiple time scales. Common recurrent neural networks, however, do not explicitly accommodate such a hierarchy, and most research on them has focused on training algorithms rather than on their basic architecture. In this paper we study the effect of a hierarchy of recurrent neural networks on processin...